A Three-term Conjugate Gradient Method with a Random Parameter for Large-scale Unconstrained Optimization and its Application in Regression Model

Authors

Abstract

In this paper, a new three-term conjugate gradient algorithm is proposed to solve unconstrained optimization problems, including regression problems. We minimize the distance, in the Frobenius norm, between the search direction matrix and the self-scaling memoryless BFGS matrix to determine the search direction, which therefore shares the advantages of the quasi-Newton method. At the same time, a random parameter is used so that the search direction satisfies the sufficient descent condition. For both uniformly convex functions and general nonlinear functions, we establish the global convergence of the algorithm. Numerical experiments show that our method has good numerical performance for solving large-scale optimization problems. In addition, its application to the regression model proves effective.
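The three-term structure described above can be sketched in generic form: each direction combines the negative gradient with two correction terms built from the previous step and the gradient difference, with a restart safeguard enforcing sufficient descent. The parameter choices below (Hestenes-Stiefel-type coefficients, a backtracking Armijo line search, and the safeguard threshold) are illustrative assumptions, not the paper's random-parameter rule or its Frobenius-norm derivation.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Generic three-term conjugate gradient sketch.

    Direction update: d_{k+1} = -g_{k+1} + beta_k * s_k - theta_k * y_k,
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    Coefficient choices here are illustrative only.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search for the step size
        t, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + t * d) > fx + c * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d.dot(y)
        if abs(denom) > 1e-12:
            beta = g_new.dot(y) / denom    # HS-type coefficient (assumed)
            theta = g_new.dot(s) / denom   # third-term coefficient (assumed)
            d_new = -g_new + beta * s - theta * y
        else:
            d_new = -g_new
        # Safeguard: fall back to steepest descent if the sufficient
        # descent condition g^T d <= -c1 * ||g||^2 fails
        if g_new.dot(d_new) > -1e-10 * g_new.dot(g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5*||x||^2 + x_0,
# whose unique minimizer is x* = (-1, 0)
f = lambda x: 0.5 * x.dot(x) + x[0]
grad = lambda x: x + np.array([1.0, 0.0])
x_star = three_term_cg(f, grad, np.array([5.0, -3.0]))
```

The safeguard restart is what guarantees every direction is a descent direction regardless of the coefficient values, which is the role the random parameter plays in the paper's method.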


Similar Articles

A new hybrid conjugate gradient algorithm for unconstrained optimization

In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. This new method can generate sufficient descent directions unrelated to any line search. Moreover, the global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...


A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems, because they do not need the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods which always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we pres...


A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirement and simplicity of implementation. Research activities on its application to higher-dimensional systems of nonlinear equations are just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independent of the line search method. The globa...


A Self-Adjusting Spectral Conjugate Gradient Method for Large-Scale Unconstrained Optimization

Additionally, we assume that there exist positive constants γ and γ̄ such that 0 < γ ≤ ‖g_k‖ ≤ γ̄ for all k ≥ 1 (21); then we have the following result. Theorem 2. Consider the method (2), (8), and (12), where d_k is a descent direction. If (21) holds, there exist positive constants ξ_1, ξ_2, and ξ_3 such that the relations...



Journal

Journal title: Taiwanese Journal of Mathematics

Year: 2023

ISSN: 1027-5487, 2224-6851

DOI: https://doi.org/10.11650/tjm/230503